63 research outputs found

    Messy Tabletops: Clearing Up The Occlusion Problem

    When introducing interactive tabletops into the home and office, lack of space will often mean that these devices play two roles: interactive display and a place for putting things. Clutter on the table surface may occlude information on the display, preventing the user from noticing it or interacting with it. We present a technique for dealing with clutter on tabletops that finds a suitable unoccluded area of the display in which to show content. We discuss the implementation of this technique and some design issues that arose during implementation.
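
    The core step described here, relocating content to an unoccluded area, can be sketched as a search for the largest clear rectangle in an occlusion mask. The grid representation, function name and histogram-based search below are illustrative assumptions, not the paper's implementation.

        # Minimal sketch: given a boolean occlusion grid sampled from the table
        # surface, find the largest axis-aligned unoccluded rectangle in which
        # to show the relocated content.
        def largest_clear_rect(occluded):
            """occluded: 2D list of bools, True where clutter covers the display.
            Returns (row, col, height, width) of the largest clear rectangle."""
            rows, cols = len(occluded), len(occluded[0])
            heights = [0] * cols          # running count of clear cells per column
            best = (0, 0, 0, 0)
            for r in range(rows):
                for c in range(cols):
                    heights[c] = 0 if occluded[r][c] else heights[c] + 1
                stack = []                # largest rectangle in this row's histogram
                for c in range(cols + 1):
                    h = heights[c] if c < cols else 0
                    while stack and heights[stack[-1]] >= h:
                        height = heights[stack.pop()]
                        left = stack[-1] + 1 if stack else 0
                        width = c - left
                        if height * width > best[2] * best[3]:
                            best = (r - height + 1, left, height, width)
                    stack.append(c)
            return best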

    Rhythmic Micro-Gestures: Discreet Interaction On-the-Go

    We present rhythmic micro-gestures, micro-movements of the hand that are repeated in time with a rhythm. We present a user study that investigated how well users can perform rhythmic micro-gestures and whether they can use them eyes-free with non-visual feedback. We found that users could successfully use our interaction technique (97% success rate across all gestures) with short interaction times, and they rated the gestures as low in difficulty. Simple audio cues that only convey the rhythm outperformed animations showing the hand movements, supporting rhythmic micro-gestures as an eyes-free input technique.
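
    As a rough illustration of the rhythm element, a recogniser could compare the gaps between detected micro-movements against a target beat pattern. The function, tolerance value and example timings below are assumptions for illustration, not the study's recogniser.

        # Minimal sketch: does a sequence of movement timestamps match a rhythm?
        def matches_rhythm(onsets, beat_pattern, tolerance=0.15):
            """onsets: movement timestamps in seconds, e.g. [0.0, 0.5, 1.0, 1.5].
            beat_pattern: expected gaps between movements, e.g. [0.5, 0.5, 0.5].
            tolerance: allowed relative deviation per gap (assumed value)."""
            if len(onsets) != len(beat_pattern) + 1:
                return False
            gaps = [b - a for a, b in zip(onsets, onsets[1:])]
            return all(abs(g - e) <= tolerance * e
                       for g, e in zip(gaps, beat_pattern))

        # Example: three taps roughly half a second apart match a steady rhythm.
        print(matches_rhythm([0.00, 0.52, 1.01, 1.49], [0.5, 0.5, 0.5]))  # True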

    Automatically Adapting Home Lighting to Assist Visually Impaired Children

    For visually impaired children, activities like finding everyday items, locating favourite toys and moving around the home can be challenging. Assisting them during these activities is important because it promotes independence and encourages them to use and develop their remaining visual function. We describe our work towards a system that adapts the lighting conditions at home to help visually impaired children with everyday tasks. We discuss scenarios that show how they may benefit from adaptive lighting, report on our progress and describe our planned future work and evaluation.

    Towards a Multimodal Adaptive Lighting System for Visually Impaired Children

    Visually impaired children often have difficulty with everyday activities like locating items, e.g. favourite toys, and moving safely around the home. It is important to assist them during activities like these because it can promote independence from adults and help them develop skills. Our demonstration shows our work towards a multimodal sensing and output system that adapts the lighting conditions at home to help visually impaired children with such tasks.

    Levitate: Interaction with Floating Particle Displays

    This demonstration showcases the current state of the art for the levitating particle display from the Levitate Project. We show a new type of display consisting of floating voxels, small levitating particles that can be positioned and moved independently in 3D space. Phased ultrasound arrays are used to acoustically levitate the particles. Users can interact directly with each particle using pointing gestures. This allows users to walk up and interact without any user instrumentation, creating an exciting opportunity to deploy these tangible displays in public spaces in the future. This demonstration explores the design potential of floating voxels and how these may be used to create new types of user interfaces.
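
    One way to realise the pointing interaction mentioned here is to select the voxel closest to a pointing ray from a hand tracker. The data layout, function names and distance threshold below are assumptions for illustration, not the Levitate system's code.

        import math

        # Minimal sketch: pick the levitated voxel a user is pointing at, given
        # a pointing ray (origin and normalised direction) from a hand tracker.
        def point_to_ray_distance(point, ray_origin, ray_dir):
            v = [p - o for p, o in zip(point, ray_origin)]
            t = max(0.0, sum(vi * di for vi, di in zip(v, ray_dir)))  # no picking behind the hand
            closest = [o + t * d for o, d in zip(ray_origin, ray_dir)]
            return math.dist(point, closest)

        def pick_voxel(voxels, ray_origin, ray_dir, max_distance=0.02):
            """voxels: dict of id -> (x, y, z) in metres. Returns the id of the
            voxel nearest the pointing ray, or None if none is within range."""
            best_id, best_dist = None, max_distance
            for vid, pos in voxels.items():
                d = point_to_ray_distance(pos, ray_origin, ray_dir)
                if d < best_dist:
                    best_id, best_dist = vid, d
            return best_id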

    Do That, There: An Interaction Technique for Addressing In-Air Gesture Systems

    When users want to interact with an in-air gesture system, they must first address it. This involves finding where to gesture so that their actions can be sensed, and how to direct their input towards that system so that they do not also affect others or cause unwanted effects. This is an important problem [6] that lacks a practical solution. We present an interaction technique that uses multimodal feedback to help users address in-air gesture systems. The feedback tells them how (“do that”) and where (“there”) to gesture, using light, audio and tactile displays. By doing that there, users can direct their input to the system they wish to interact with, in a place where their gestures can be sensed. We discuss the design of our technique and three experiments investigating its use, finding that users can “do that” well (93.2%–99.9%) while accurately (51 mm–80 mm) and quickly (3.7 s) finding “there”.
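
    The “where” half of the feedback can be illustrated as checking the tracked hand against the sensing volume and hinting which way to move. The box-shaped zone, function name and example coordinates below are illustrative assumptions, not the paper's implementation.

        # Minimal sketch: simple "where" feedback against an assumed box-shaped
        # sensing zone; an empty result means the hand can already be sensed.
        def where_feedback(hand, zone_min, zone_max):
            """hand, zone_min, zone_max: (x, y, z) positions in metres."""
            hints = []
            directions = [("left", "right"), ("down", "up"), ("backward", "forward")]
            for h, lo, hi, (neg, pos) in zip(hand, zone_min, zone_max, directions):
                if h < lo:
                    hints.append(f"move {pos}")   # below the zone on this axis
                elif h > hi:
                    hints.append(f"move {neg}")
            return hints

        # Example: hand 10 cm to the left of a 30 cm cubic zone above the sensor.
        print(where_feedback((-0.25, 0.10, 0.20),
                             (-0.15, -0.15, 0.05), (0.15, 0.15, 0.35)))  # ['move right']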

    Levitating Particle Displays with Interactive Voxels

    Levitating objects can be used as the primitives in a new type of display. We present levitating particle displays and show how research into object levitation is enabling a new way of presenting and interacting with information. We identify novel properties of levitating particle displays and give examples of the interaction techniques and applications they allow. We then discuss design challenges for these displays, potential solutions, and promising areas for future research.

    Floating Widgets: Interaction with Acoustically-Levitated Widgets

    Acoustic levitation enables new types of human-computer interface, where the content that users interact with is made up of small objects held in mid-air. We show that acoustically-levitated objects can form mid-air widgets that respond to interaction. Users can interact with them using in-air hand gestures. Sound and widget movement are used as feedback about the interaction.
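
    As an illustration of a widget that responds with sound and movement, consider a levitated button bead that dips and clicks when a hand comes close. The class, thresholds and callback names below are hypothetical, not the demonstrated system.

        import math

        # Minimal sketch: a levitated "button" bead that dips and plays a click
        # when a hand comes close, then returns to its rest position.
        class FloatingButton:
            def __init__(self, rest_pos, press_depth=0.01, activation_radius=0.03):
                self.rest_pos = rest_pos                  # (x, y, z) of the bead, metres
                self.press_depth = press_depth            # how far the bead dips
                self.activation_radius = activation_radius
                self.pressed = False

            def update(self, hand_pos, move_bead, play_sound):
                """move_bead and play_sound are callbacks into the (assumed)
                levitation control and audio layers."""
                near = math.dist(hand_pos, self.rest_pos) < self.activation_radius
                if near and not self.pressed:
                    self.pressed = True
                    x, y, z = self.rest_pos
                    move_bead((x, y, z - self.press_depth))   # movement feedback
                    play_sound("click")                       # audio feedback
                elif not near and self.pressed:
                    self.pressed = False
                    move_bead(self.rest_pos)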

    Textured Surfaces for Ultrasound Haptic Displays

    We demonstrate a technique for rendering textured haptic surfaces in mid-air, using an ultrasound haptic display. Our technique renders tessellated 3D 'haptic' shapes with different waveform properties, creating surfaces that feel perceptibly distinct.
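
    The idea of varying waveform properties per surface can be sketched as a lookup from texture name to modulation settings for the focal points that trace each tessellated face. The parameter names and values below are assumptions for illustration, not the demonstrated renderer.

        # Minimal sketch: map textures to amplitude-modulation settings for the
        # focal points that trace a tessellated haptic surface.
        TEXTURES = {                      # hypothetical values: Hz, m/s
            "smooth": {"mod_freq": 200.0, "draw_speed": 8.0},
            "rough":  {"mod_freq": 40.0,  "draw_speed": 2.0},
            "ridged": {"mod_freq": 100.0, "draw_speed": 4.0},
        }

        def focal_point_commands(triangles, texture):
            """triangles: list of 3-vertex tuples tessellating the surface.
            Yields (centre, mod_freq, draw_speed) commands for the array driver."""
            params = TEXTURES[texture]
            for tri in triangles:
                centre = tuple(sum(axis) / 3.0 for axis in zip(*tri))
                yield centre, params["mod_freq"], params["draw_speed"]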

    Towards usable and acceptable above-device interactions

    Gestures above a mobile phone would let users interact with their devices quickly and easily from a distance. While both researchers and smartphone manufacturers develop new gesture sensing technologies, little is known about how best to design these gestures and interaction techniques. Our research looks at creating usable and socially acceptable above-device interaction techniques. We present an initial gesture collection, a preliminary evaluation of these gestures and some design recommendations. Our findings identify interesting areas for future research and will help designers create better gesture interfaces.
    • …